Neural network tools
Build arbitrarily deep networks.
layers should be a list or tuple of integers that indicates how many neurons each layer should have. bias and outputbias are flags indicating whether the network should have the corresponding biases; both default to True.
To adjust the classes for the layers use the hiddenclass and outclass parameters, which expect a subclass of NeuronLayer.
If the recurrent flag is set, a RecurrentNetwork will be created, otherwise a FeedForwardNetwork.
If the fast flag is set, faster arac networks will be used instead of the pybrain implementations.
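A minimal usage sketch (buildNetwork lives in pybrain.tools.shortcuts and TanhLayer in pybrain.structure; the layer sizes chosen here are arbitrary):

```python
from pybrain.tools.shortcuts import buildNetwork
from pybrain.structure import TanhLayer

# 2 inputs, two hidden layers with 4 and 3 neurons, 1 output;
# the hidden layers use TanhLayer, biases are enabled by default.
net = buildNetwork(2, 4, 3, 1, hiddenclass=TanhLayer)

out = net.activate([0.5, -0.2])  # forward pass on a single input vector
```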
Learns to numerically predict the targets of a set of data, with optional online progress plots.
Initialize with the training data set DS. All keywords given are set as member variables. The following are particularly important:
hidden: number of hidden units
tds: test data set for checking convergence
vds: validation data set for final performance evaluation
epoinc: number of epochs to train for before checking convergence (default: 5)
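A sketch of the keyword-initialization pattern described above, assuming NNregression is importable from pybrain.tools.neuralnets and exposes setupNN/runTraining as in the library's example usage; the data set is a toy stand-in:

```python
from pybrain.datasets import SupervisedDataSet
from pybrain.tools.neuralnets import NNregression

# Toy regression data set: 1 input, 1 target (y = 2x).
ds = SupervisedDataSet(1, 1)
for x in (0.0, 0.5, 1.0, 1.5):
    ds.addSample([x], [2 * x])

# Keywords become member variables; 'hidden' sets the hidden layer size,
# 'epoinc' the number of epochs between convergence checks.
reg = NNregression(ds, hidden=5, epoinc=5)
reg.setupNN()      # build the underlying network and trainer
reg.runTraining()  # train, optionally with online progress plots
```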
Learns to classify a set of data, with optional online progress plots.
Dataset tools
Converts a sequential classification dataset into time windows of fixed length. Assumes the correct class is given at the last timestep of each sequence. Incomplete windows at the sequence end are pruned. No overlap between windows.
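The windowing idea itself can be sketched independently of the library call; the helper below is illustrative only, not the pybrain function:

```python
import numpy as np

def to_time_windows(sequence, winsize):
    """Split a (timesteps x features) array into non-overlapping windows
    of length winsize; incomplete data at the end of the sequence is
    pruned, mirroring the behaviour described above."""
    seq = np.asarray(sequence)
    n_windows = len(seq) // winsize
    return [seq[i * winsize:(i + 1) * winsize] for i in range(n_windows)]

# 10 timesteps cut into windows of 3: yields 3 windows, last step dropped.
windows = to_time_windows(np.arange(10).reshape(10, 1), winsize=3)
```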
Training performance validation tools
This class provides methods for validating calculated output values against their corresponding target values. It does not know anything about modules or other pybrain components; it works purely on arrays and therefore contains only the core calculations.
The class contains only classmethods, since it is used as a kind of namespace rather than instantiated.
Returns the explained sum of squares (ESS).
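For reference, the textbook explained sum of squares measures the squared deviation of the predictions from the mean of the targets; this is background only, and the library's exact implementation may differ:

```python
import numpy as np

def explained_sum_of_squares(output, target):
    # Textbook definition: sum of squared deviations of the predictions
    # from the mean of the target values.
    output = np.asarray(output).flatten()
    target = np.asarray(target).flatten()
    return float(np.sum((output - target.mean()) ** 2))
```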
Returns the mean squared error. The multidimensional arrays will get flattened in order to compare them.
importance: each squared error is multiplied by its corresponding importance value; after summing these weighted errors, the result is divided by the sum of all importance values for normalization.
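A minimal sketch of the weighting scheme just described (illustrative, not the library code):

```python
import numpy as np

def weighted_mse(output, target, importance=None):
    # Flatten both arrays before comparing, as described above.
    err = (np.asarray(output).flatten() - np.asarray(target).flatten()) ** 2
    if importance is None:
        return float(err.mean())
    importance = np.asarray(importance).flatten()
    # Each squared error is scaled by its importance; the sum is then
    # normalized by the total importance.
    return float(np.sum(importance * err) / np.sum(importance))
```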
Returns the hit rate of the outputs compared to the targets.
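The hit rate is simply the fraction of outputs that match their targets; a sketch assuming both are given as class labels:

```python
import numpy as np

def hit_rate(output, target):
    output = np.asarray(output).flatten()
    target = np.asarray(target).flatten()
    # Fraction of positions where the predicted class equals the target.
    return float(np.mean(output == target))
```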
This class provides methods for validating calculated output values against their corresponding target values. In particular, it handles pybrain's module and dataset classes; for the core calculations, the Validator class is used.
The class contains only classmethods, since it is used as a kind of namespace rather than instantiated.
Returns the mean squared error.
Calculates the module’s output on the dataset. Can be called with any type of dataset.
Parameter: dataset – any Dataset object containing an ‘input’ field.
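Conceptually, this amounts to activating the module on every row of the dataset's ‘input’ field; an illustrative sketch that ignores the special handling of sequential datasets:

```python
import numpy as np

def module_output_on_dataset(module, dataset):
    # Activate the module once per input row; the outputs are stacked
    # into a single array.
    return np.array([module.activate(inp) for inp in dataset['input']])
```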
Returns the hit rate of the module’s output compared to the targets stored inside dataset.
Abstract validate function that is used heavily by this class. It first calculates the module’s output on the dataset, then compares that output to the dataset’s target values using the valfunc function and returns the result.
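The control flow described above can be sketched as follows, where valfunc is any scoring function taking (output, target) arrays; this is illustrative, not the library code:

```python
import numpy as np

def validate(valfunc, module, dataset):
    # First compute the module's output on every input row of the
    # dataset, then score it against the stored targets with valfunc.
    output = np.array([module.activate(inp) for inp in dataset['input']])
    return valfunc(output, dataset['target'])
```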
Class for cross-validating data. A CrossValidator object must be supplied with a trainer that contains a module and a dataset. The dataset is then shuffled and split into n parts of equal length.
A clone of the trainer and its module is made and trained with n-1 parts of the split dataset. After training, the module is validated with the n’th part of the dataset, which was not used during training.
This is done for each possible combination of n-1 dataset pieces. The mean of the calculated validation results is returned.
Set the specified member variables.
max_epochs: maximum number of epochs the trainer should train the module for.
verbosity: set verbosity level.
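The procedure above corresponds to plain n-fold cross-validation; the following self-contained sketch uses a hypothetical train_and_score callable in place of the trainer/module machinery:

```python
import random

def cross_validate(train_and_score, samples, n_folds=5, seed=0):
    """Shuffle the samples, split them into n folds of (roughly) equal
    length, train on n-1 folds, validate on the held-out fold, and
    return the mean validation result.

    train_and_score(train_set, test_set) is a hypothetical callable that
    trains a fresh copy of the module on train_set and returns a
    validation score on test_set.
    """
    samples = list(samples)
    random.Random(seed).shuffle(samples)
    folds = [samples[i::n_folds] for i in range(n_folds)]
    scores = []
    for i in range(n_folds):
        test_set = folds[i]
        train_set = [s for j, fold in enumerate(folds) if j != i for s in fold]
        scores.append(train_and_score(train_set, test_set))
    return sum(scores) / len(scores)
```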
Auxiliary functions