memilio.surrogatemodel.ode_secir_simple.network_architectures
Functions

| cnn_multi_input_multi_output | CNN Network which uses multiple time steps as input and returns the 8 compartments for multiple time steps in the future. |
| lstm_multi_input_multi_output | LSTM Network which uses multiple time steps as input and returns the compartments for multiple time steps in the future. |
| lstm_network_multi_input_single_output | LSTM Network which uses multiple time steps as input and returns the 8 compartments for one single time step in the future. |
| mlp_multi_input_multi_output | Simple MLP Network which takes the compartments for multiple time steps as input and returns the 8 compartments for multiple time steps. |
| mlp_multi_input_single_output | Simple MLP Network which takes the compartments for multiple time steps as input and returns the 8 compartments for one single time step. |
- memilio.surrogatemodel.ode_secir_simple.network_architectures.cnn_multi_input_multi_output(label_width, conv_size=3, num_outputs=8, num_filters=256, num_hidden_layers=1, num_neurons_per_layer=256, activation='relu')
CNN Network which uses multiple time steps as input and returns the 8 compartments for multiple time steps in the future.
Input and output have shape [number of expert model simulations, time points in simulation, number of individuals in infection states]. The parameter conv_size describes the kernel_size of the Conv1D layer. The parameter is also used in combination with a Lambda layer to transform the input to shape [batch, conv_size, features].
- Parameters:
label_width – Number of time steps in the output.
conv_size – Width of the convolution kernel. Default: 3.
num_outputs – Number of compartments. Default: 8. The default value is reached when aggregating the confirmed compartments.
num_filters – Number of filters used in the Conv1D layer.
num_hidden_layers – Number of layers in the dense network following the convolution layer.
num_neurons_per_layer – Number of neurons in each of the hidden layers (except the output layer).
activation – Activation function used in the hidden MLP layers.
- Returns:
TensorFlow Keras model with the desired CNN architecture.
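Based on the parameter descriptions above, the architecture can be sketched in plain tf.keras. The exact layer ordering and the slicing performed by the Lambda layer are assumptions drawn from the description, not the library's verbatim implementation:

```python
import tensorflow as tf

def cnn_multi_input_multi_output(label_width, conv_size=3, num_outputs=8,
                                 num_filters=256, num_hidden_layers=1,
                                 num_neurons_per_layer=256, activation='relu'):
    """Hypothetical sketch of the CNN architecture described above."""
    model = tf.keras.Sequential()
    # Keep only the last conv_size time steps: [batch, conv_size, features]
    model.add(tf.keras.layers.Lambda(lambda x: x[:, -conv_size:, :]))
    # 'valid' padding on conv_size steps collapses the time axis to length 1
    model.add(tf.keras.layers.Conv1D(num_filters, kernel_size=conv_size,
                                     activation=activation))
    for _ in range(num_hidden_layers):
        model.add(tf.keras.layers.Dense(num_neurons_per_layer,
                                        activation=activation))
    # Predict all label_width time steps at once, then restore [time, compartments]
    model.add(tf.keras.layers.Dense(label_width * num_outputs))
    model.add(tf.keras.layers.Reshape([label_width, num_outputs]))
    return model

model = cnn_multi_input_multi_output(label_width=30)
# 4 simulations, 5 input time steps, 8 compartments
out = model(tf.zeros([4, 5, 8]))
```

With this sketch, the output has shape [4, 30, 8], i.e. label_width predicted time steps of num_outputs compartments per simulation.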
- memilio.surrogatemodel.ode_secir_simple.network_architectures.lstm_multi_input_multi_output(label_width, num_outputs=8, internal_dimension=32, num_hidden_layers=1, num_neurons_per_layer=32, activation='relu')
LSTM Network which uses multiple time steps as input and returns the compartments for multiple time steps in the future.
Input and output have shape [number of expert model simulations, time points in simulation, number of individuals in infection states].
- Parameters:
label_width – Number of time steps in the output.
num_outputs – Number of compartments. Default: 8. The default value is reached when aggregating the confirmed compartments.
internal_dimension – Output dimension of the LSTM layer.
num_hidden_layers – Number of hidden layers in the dense network.
num_neurons_per_layer – Number of neurons per hidden layer.
activation – Name of the used activation function.
- Returns:
TensorFlow Keras model with the desired LSTM architecture.
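A minimal sketch of this architecture, assuming the LSTM's final hidden state feeds the dense network and the last layer is reshaped to [label_width, num_outputs] (the reshape is an assumption, not confirmed by the text above):

```python
import tensorflow as tf

def lstm_multi_input_multi_output(label_width, num_outputs=8,
                                  internal_dimension=32, num_hidden_layers=1,
                                  num_neurons_per_layer=32, activation='relu'):
    """Hypothetical sketch of the LSTM multi-output architecture."""
    model = tf.keras.Sequential()
    # Read the whole input sequence; keep only the final hidden state
    model.add(tf.keras.layers.LSTM(internal_dimension, return_sequences=False))
    for _ in range(num_hidden_layers):
        model.add(tf.keras.layers.Dense(num_neurons_per_layer,
                                        activation=activation))
    model.add(tf.keras.layers.Dense(label_width * num_outputs))
    model.add(tf.keras.layers.Reshape([label_width, num_outputs]))
    return model

out = lstm_multi_input_multi_output(label_width=30)(tf.zeros([4, 5, 8]))
```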
- memilio.surrogatemodel.ode_secir_simple.network_architectures.lstm_network_multi_input_single_output(num_outputs=8, internal_dimension=32, num_hidden_layers=1, num_neurons_per_layer=32, activation='relu')
LSTM Network which uses multiple time steps as input and returns the 8 compartments for one single time step in the future.
Input and output have shape [number of expert model simulations, time points in simulation, number of individuals in infection states].
- Parameters:
num_outputs – Number of compartments. Default: 8. The default value is reached when aggregating the confirmed compartments.
internal_dimension – Output dimension of the LSTM-layer.
num_hidden_layers – Number of hidden layers in the dense network.
num_neurons_per_layer – Number of neurons per hidden layer.
activation – Name of the used activation function.
- Returns:
TensorFlow Keras model with the desired LSTM architecture.
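The single-output variant differs from the multi-output one only in its head. A sketch under the same assumptions (layer order inferred from the parameters, not the library's exact code):

```python
import tensorflow as tf

def lstm_network_multi_input_single_output(num_outputs=8, internal_dimension=32,
                                           num_hidden_layers=1,
                                           num_neurons_per_layer=32,
                                           activation='relu'):
    """Hypothetical sketch: final LSTM state -> dense layers -> one time step."""
    model = tf.keras.Sequential()
    model.add(tf.keras.layers.LSTM(internal_dimension, return_sequences=False))
    for _ in range(num_hidden_layers):
        model.add(tf.keras.layers.Dense(num_neurons_per_layer,
                                        activation=activation))
    model.add(tf.keras.layers.Dense(num_outputs))
    # Reshape to [1, num_outputs] so the output is comparable to the
    # multi-output models' [time steps, compartments] layout
    model.add(tf.keras.layers.Reshape([1, num_outputs]))
    return model

out = lstm_network_multi_input_single_output()(tf.zeros([4, 5, 8]))
```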
- memilio.surrogatemodel.ode_secir_simple.network_architectures.mlp_multi_input_multi_output(label_width, num_outputs=8, num_hidden_layers=3, num_neurons_per_layer=32, activation='relu')
Simple MLP Network which takes the compartments for multiple time steps as input and returns the 8 compartments for multiple time steps.
The output of the final dense layer is reshaped to [label_width, num_outputs], so each predicted time step contains the num_outputs compartments. This makes the shape comparable to that of the other models.
- Parameters:
label_width – Number of time steps in the output.
num_outputs – Number of compartments. Default: 8. The default value is reached when aggregating the confirmed compartments.
num_hidden_layers – Number of hidden dense layers in the MLP architecture.
num_neurons_per_layer – Number of neurons per hidden layer.
activation – Name of the used activation function.
- Returns:
TensorFlow Keras model with the desired MLP architecture.
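Since the MLP has no recurrent or convolutional structure, a plausible sketch flattens all input time steps into one feature vector before the dense layers; the Flatten layer and the final reshape are assumptions based on the description:

```python
import tensorflow as tf

def mlp_multi_input_multi_output(label_width, num_outputs=8,
                                 num_hidden_layers=3,
                                 num_neurons_per_layer=32, activation='relu'):
    """Hypothetical sketch of the MLP multi-output architecture."""
    # Collapse [time steps, compartments] into one flat feature vector
    model = tf.keras.Sequential([tf.keras.layers.Flatten()])
    for _ in range(num_hidden_layers):
        model.add(tf.keras.layers.Dense(num_neurons_per_layer,
                                        activation=activation))
    model.add(tf.keras.layers.Dense(label_width * num_outputs))
    model.add(tf.keras.layers.Reshape([label_width, num_outputs]))
    return model

out = mlp_multi_input_multi_output(label_width=30)(tf.zeros([4, 5, 8]))
```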
- memilio.surrogatemodel.ode_secir_simple.network_architectures.mlp_multi_input_single_output(num_outputs=8, num_hidden_layers=3, num_neurons_per_layer=32, activation='relu')
Simple MLP Network which takes the compartments for multiple time steps as input and returns the 8 compartments for one single time step.
Reshaping adds an extra dimension to the output, so the shape of the output is 1x8. This makes the shape comparable to that of the multi-output models.
- Parameters:
num_outputs – Default: 8 Number of compartments. Default value is reached when aggregating the confirmed compartments.
num_hidden_layers – Number of hidden dense layers in the MLP architecture
num_neurons_per_layer – Number of neurons per hidden layer
activation – name of the used activation function
- Returns:
TensorFlow Keras model with the desired MLP architecture.
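The single-step MLP can be sketched the same way; the 1x8 output shape mentioned above comes from reshaping the final dense layer's output (the Flatten layer is an assumption, not confirmed by the text):

```python
import tensorflow as tf

def mlp_multi_input_single_output(num_outputs=8, num_hidden_layers=3,
                                  num_neurons_per_layer=32, activation='relu'):
    """Hypothetical sketch of the MLP single-output architecture."""
    model = tf.keras.Sequential([tf.keras.layers.Flatten()])
    for _ in range(num_hidden_layers):
        model.add(tf.keras.layers.Dense(num_neurons_per_layer,
                                        activation=activation))
    model.add(tf.keras.layers.Dense(num_outputs))
    # Extra leading time dimension makes the 1x8 output comparable to the
    # multi-output models
    model.add(tf.keras.layers.Reshape([1, num_outputs]))
    return model

out = mlp_multi_input_single_output()(tf.zeros([4, 5, 8]))
```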