| Submodel Component | Description |
|---|---|
| Bidirectional LSTM layer | Softplus activation; recurrent dropout used |
| Dense layer (non-output) | SELU activation; L1 and L2 activity regularization used |
| Dense layer (output) | No activation or activity regularization used |
| Optimizer | Nadam, with MAPE as the loss function |
| Callbacks | Early stopping at 50 epochs |
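The configuration above can be sketched in Keras as follows. The layer types, activations, regularizers, optimizer, loss, and callback follow the table; the input shape, unit counts, dropout rate, and regularization strengths are illustrative assumptions, and the early-stopping criterion is interpreted here as a patience of 50 epochs.

```python
# Hedged sketch of the submodel described in the table above.
# Assumptions (not stated in the source): input shape (30, 1),
# 64 LSTM units, 32 dense units, recurrent_dropout=0.2,
# l1=1e-5, l2=1e-4, and EarlyStopping patience of 50 epochs.
from tensorflow import keras
from tensorflow.keras import layers, regularizers

model = keras.Sequential([
    layers.Input(shape=(30, 1)),  # assumed: 30 time steps, 1 feature
    # Bidirectional LSTM: softplus activation, recurrent dropout
    layers.Bidirectional(
        layers.LSTM(64, activation="softplus", recurrent_dropout=0.2)
    ),
    # Non-output dense layer: SELU activation, L1+L2 activity regularization
    layers.Dense(
        32,
        activation="selu",
        activity_regularizer=regularizers.l1_l2(l1=1e-5, l2=1e-4),
    ),
    # Output dense layer: no activation, no activity regularization
    layers.Dense(1),
])

# Nadam optimizer with MAPE (mean absolute percentage error) as the loss
model.compile(optimizer="nadam", loss="mean_absolute_percentage_error")

# Early-stopping callback; the 50 epochs from the table is assumed
# to be the patience on the monitored validation loss
early_stop = keras.callbacks.EarlyStopping(monitor="val_loss", patience=50)
```

In practice the callback would be passed to `model.fit(..., callbacks=[early_stop])`; the unit counts and regularization strengths would be tuned for the task at hand.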