2005-04-19

Basic Predictor

I was thinking about this predictor, and I got to a pretty simple solution.

I will try to explain it, and show how it builds up. So let's start.

The simplest predictor ever

Let's start with something simple like:

class Predictor
{
public:
    int input;
    int output;

    void ProcessInput()
    {
        output = input;
    }
};



This simple class doesn't learn anything, nor does it predict anything. It just repeats on its output whatever input it gets. It's a repeater: if you give it the input stream 1,2,1,2,1,2 you'll get 1,2,1,2,1,2 as output.
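To make that concrete, here is a tiny driver sketch (the main() and the test stream are just mine, for illustration; it assumes the Predictor class above is in the same file):

#include <iostream>

int main()
{
    Predictor p;
    int stream[] = { 1, 2, 1, 2, 1, 2 };

    // the repeater just echoes every input on its output
    for (int i = 0; i < 6; ++i)
    {
        p.input = stream[i];
        p.ProcessInput();
        std::cout << p.output << " "; // prints: 1 2 1 2 1 2
    }
    std::cout << std::endl;
    return 0;
}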

The transcoder

Let's add a dictionary to the repeater:


#include <cstdlib> // rand()
#include <map>

typedef std::map<int, int> Dictionary;

class Predictor
{
public:
    int input;
    int output;

private:
    Dictionary dictionary;

public:
    void ProcessInput()
    {
        // check if we already have this input
        Dictionary::const_iterator it = dictionary.find(input);
        if (it == dictionary.end())
        {
            // if this is a new input, add it to the dictionary
            int new_output = rand();
            it = dictionary.insert(std::make_pair(input, new_output)).first;
        }

        output = (*it).second;
    }
};



Now, this does something. For each input it receives, it remembers it, transforms it into something else and outputs that new thing. If it receives the same input again, it produces the same output: feeding it 1,2,3,1,2,3,1,2,3 will produce something like 42, 43, 44, 42, 43, 44, 42, 43, 44. (The actual output values are randomly generated for now, but I don't think it matters.)
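A quick way to check the "same input, same output" property (again, just an illustrative test of mine; the concrete codes are whatever rand() produces):

#include <cassert>

int main()
{
    Predictor p;

    p.input = 1; p.ProcessInput();
    int code_for_1 = p.output; // some random code, e.g. 42

    p.input = 2; p.ProcessInput();
    int code_for_2 = p.output; // another random code, e.g. 43

    p.input = 1; p.ProcessInput();
    assert(p.output == code_for_1); // a repeated 1 maps to the same code

    p.input = 2; p.ProcessInput();
    assert(p.output == code_for_2); // and so does a repeated 2

    return 0;
}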

Learning spatial patterns

We can do another trick: what if we don't get a single int as input, but a list of ints? Something like:


#include <cstdlib> // rand()
#include <map>
#include <vector>

typedef std::vector<int> Input;
typedef std::map<Input, int> Dictionary;

class Predictor
{
public:
    Input input;
    int output;

private:
    Dictionary dictionary;

public:
    void ProcessInput()
    {
        // check if we already have this input pattern
        Dictionary::const_iterator it = dictionary.find(input);
        if (it == dictionary.end())
        {
            // if this is a new pattern, add it to the dictionary
            int new_output = rand();
            it = dictionary.insert(std::make_pair(input, new_output)).first;
        }

        output = (*it).second;
    }
};



What do we get? We get a thing which receives patterns and produces a single output representing each pattern.

What this thing learns are spatial patterns, as they are described in the On Intelligence book: given the input [1,2,3] [1,2,3] [2,3,4] [2,3,4], it will produce an output something like 42, 42, 43, 43.
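For example, feeding it those four patterns (the driver below is only an illustration; the concrete codes depend on rand()):

#include <iostream>

int main()
{
    Predictor p;

    Input a; // the pattern [1,2,3]
    a.push_back(1); a.push_back(2); a.push_back(3);

    Input b; // the pattern [2,3,4]
    b.push_back(2); b.push_back(3); b.push_back(4);

    Input sequence[] = { a, a, b, b };
    for (int i = 0; i < 4; ++i)
    {
        p.input = sequence[i];
        p.ProcessInput();
        std::cout << p.output << " "; // e.g. 42 42 43 43
    }
    std::cout << std::endl;
    return 0;
}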

Learning Temporal Patterns

Another trick we can do to the transcoder is adding a buffer to keep the incoming inputs:


#include <cstdlib> // rand()
#include <map>
#include <vector>

typedef std::vector<int> History;
typedef std::map<History, int> Dictionary;

class Predictor
{
public:
    int input;
    int output;

private:
    History history;
    Dictionary dictionary;

public:
    void ProcessInput()
    {
        output = 0; // no output unless a new sequence is learned
        history.push_back(input);

        // check if we already know this sequence of inputs
        Dictionary::const_iterator it = dictionary.find(history);
        if (it == dictionary.end())
        {
            // if this is a new sequence, add it to the dictionary,
            // emit its mapping and start a fresh history
            int new_output = rand();
            it = dictionary.insert(std::make_pair(history, new_output)).first;
            output = (*it).second;
            history.clear();
        }
    }
};



What does this thing do? It learns simple streams of inputs.

Let's see a scenario. Assume that the following inputs are fed to this thing: 1, 2, 1, 2, 1, 2, 1, 2.

When it receives the first 1, it immediately adds it to the dictionary and outputs 42.
Then it sees a 2, adds it to the dictionary and outputs 43.
Then it sees another 1. It finds it in the dictionary, and does nothing.
Then it receives the second 2, adds it to the history (which becomes (1,2)), looks that up, does not find it, adds it to the dictionary and emits 44.
The next two signals (1 and 2) are received and processed without any output, leaving the history at (1,2).
Then a new 1 comes and produces a new, unseen history of (1,2,1), which is added to the dictionary, and 45 is emitted.
And this process continues, producing a dictionary which looks like:
(1) = 42
(2) = 43
(1,2) = 44
(1,2,1) = 45
(2,1) = 46
(2,1,2) = 47
(1,2,1,2) = 48
(2,1,2,1) = 49
and so on...
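The same scenario as a small test driver (my own addition; it relies on the output == 0 convention from the code above to mean "nothing emitted", and the actual codes will be whatever rand() returns rather than 42, 43, ...):

#include <iostream>

int main()
{
    Predictor p;
    int stream[] = { 1, 2, 1, 2, 1, 2, 1, 2 };

    for (int i = 0; i < 8; ++i)
    {
        p.input = stream[i];
        p.ProcessInput();
        if (p.output != 0) // only print when a new sequence was learned
            std::cout << p.output << " ";
    }
    std::cout << std::endl;
    return 0;
}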

An interesting change is to place a limit on the size of the history. If we decide that the history cannot hold more than 3 inputs, and that whenever the history is full we emit its mapping and clear it, we end up with the dictionary containing
(1) = 42
(2) = 43
(1,2) = 44
(1,2,1) = 45
(2,1) = 46
(2,1,2) = 47
and the output for a continuous 1,2,1,2,... being
(1)42, (2)43, (1,2)44, (1,2,1)45, (2,1)46, (2,1,2)47, (1,2,1)45, (2,1,2)47, (1,2,1)45, (2,1,2)47 and so on.
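Here is a minimal sketch of how that limit could look inside the temporal Predictor above (the MAX_HISTORY constant and the exact place of the check are my guesses at one way to do it):

static const History::size_type MAX_HISTORY = 3;

void ProcessInput()
{
    output = 0;
    history.push_back(input);

    Dictionary::const_iterator it = dictionary.find(history);
    if (it == dictionary.end())
    {
        // a new sequence: learn it, emit its mapping and start over
        int new_output = rand();
        it = dictionary.insert(std::make_pair(history, new_output)).first;
        output = (*it).second;
        history.clear();
    }
    else if (history.size() == MAX_HISTORY)
    {
        // a known sequence that reached the size limit:
        // emit its mapping and start over
        output = (*it).second;
        history.clear();
    }
}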

Our little thing recognizes and encodes the simple temporal patterns in its input.

Combining the spatial and temporal pattern recognition

Our logical next step is to combine the history and the wide input:


#include <cstdlib> // rand()
#include <map>
#include <vector>

typedef std::vector<int> Input;
typedef std::vector<Input> History;
typedef std::map<History, int> Dictionary;

class Predictor
{
public:
    Input input;
    int output;

private:
    History history;
    Dictionary dictionary;

public:
    void ProcessInput()
    {
        output = 0; // no output unless a new sequence of patterns is learned
        history.push_back(input);

        Dictionary::const_iterator it = dictionary.find(history);
        if (it == dictionary.end())
        {
            // a new sequence of patterns: remember it, emit its mapping
            // and start a fresh history
            int new_output = rand();
            it = dictionary.insert(std::make_pair(history, new_output)).first;
            output = (*it).second;
            history.clear();
        }
    }
};



I'd say this is one half of the Predictor.
The other half is surprisingly symmetric to this one (it includes the prediction part).

Is this simple? Yes. Is this too simple? Dunno.

It basically does what a predictor should do.

Next steps are:
1. linking them together
2. adding the prediction part
