Orion
path: orion.primitives.vae.VAE
description: This is a reconstruction model based on a Variational AutoEncoder (VAE).
See the JSON annotation for the full primitive specification.
parameters

| argument | type | description |
| -------- | ---- | ----------- |
| X | numpy.ndarray | n-dimensional array containing the input sequences for the model |
| y | numpy.ndarray | n-dimensional array containing the target sequences to reconstruct; typically y is a signal from a selected set of channels of X |

hyperparameters

| hyperparameter | type | description |
| -------------- | ---- | ----------- |
| epochs | int | number of epochs to train the model; an epoch is one iteration over the entire X data provided |
| input_shape | tuple | shape of an input sample |
| output_shape | tuple | shape of an output sample |
| latent_dim | int | dimension of the latent space; default 20 |
| learning_rate | float | learning rate of the optimizer; default 0.001 |
| optimizer | str | name of the optimizer, or an optimizer instance; default keras.optimizers.Adam |
| batch_size | int | number of samples per gradient update; default 64 |
| shuffle | bool | whether to shuffle the training data before each epoch; default True |
| verbose | int | verbosity mode: 0 = silent, 1 = progress bar, 2 = one line per epoch; default 0 |
| lstm_units | int | number of neurons (dimensionality of the output space) |
| length | int | equal to input_shape[0] |
| callbacks | list | list of keras.callbacks.Callback instances to apply during training |
| validation_split | float | fraction of the training data to be used as validation data; default 0 |
| output_dim | int | equal to output_shape[-1] |
| layers_encoder | list | list containing the layers of the encoder |
| layers_generator | list | list containing the layers of the generator |

output

| output | type | description |
| ------ | ---- | ----------- |
| y | numpy.ndarray | predicted values |
In [1]: import numpy as np

In [2]: from mlstars import load_primitive

In [3]: X = np.array([1] * 100).reshape(1, -1, 1)

In [4]: primitive = load_primitive('orion.primitives.vae.VAE',
   ...:     arguments={"X": X, "y": X, "input_shape": (100, 1), "output_shape": (100, 1),
   ...:                "validation_split": 0, "batch_size": 1, "epochs": 5})

In [5]: primitive.fit()
Epoch 1/5
1/1 [==============================] - 2s 2s/step - loss: 1.0744
Epoch 2/5
1/1 [==============================] - 0s 22ms/step - loss: 1.0115
Epoch 3/5
1/1 [==============================] - 0s 21ms/step - loss: 0.9731
Epoch 4/5
1/1 [==============================] - 0s 19ms/step - loss: 0.7514
Epoch 5/5
1/1 [==============================] - 0s 20ms/step - loss: 0.5449

In [6]: pred = primitive.produce(X=X)
1/1 [==============================] - 1s 504ms/step

In [7]: pred.mean()
Out[7]: 0.24158257084611784
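Since this is a reconstruction model, a pipeline will typically compare the reconstruction against the original signal to obtain an error score. As a minimal illustration (plain NumPy, not part of the VAE primitive itself; the arrays below are stand-ins with the same `(1, 100, 1)` shape as in the session above, not real model output), a point-wise absolute reconstruction error could be computed like this:

```python
import numpy as np

# Stand-in arrays: `X` mimics the input sequence and `pred` mimics the
# output of primitive.produce(X=X); both have shape (1, 100, 1).
X = np.ones((1, 100, 1))
pred = np.full((1, 100, 1), 0.24)

# Point-wise absolute error between the signal and its reconstruction,
# flattened to one score per time step.
errors = np.abs(X - pred).reshape(-1)

print(errors.shape)             # (100,)
print(round(errors.mean(), 2))  # 0.76
```

Downstream primitives in a reconstruction-based anomaly detection pipeline would then threshold or smooth such an error signal to locate anomalous intervals.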