I have trained a model in Python using the following code:
from tensorflow.keras import regularizers
from tensorflow.keras.layers import Dense, Flatten, GRU
from tensorflow.keras.models import Sequential

input_shape = (22, 13)  # (timesteps, features), matching batch_input_shape in model.json
reg = 0.000001
model = Sequential()
model.add(Dense(24, activation='tanh', name='input_dense', input_shape=input_shape))
model.add(GRU(24, activation='tanh', recurrent_activation='sigmoid', return_sequences=True, kernel_regularizer=regularizers.l2(reg), recurrent_regularizer=regularizers.l2(reg), reset_after=False))
model.add(Flatten())
model.add(Dense(2, activation='softmax'))
But when I converted this model using "tensorflowjs_converter --input_format keras" and loaded it in the browser, I got the following error:
Unhandled Rejection (Error): Unknown regularizer: L2. This may be due to one of the following reasons:
- The regularizer is defined in Python, in which case it needs to be ported to TensorFlow.js or your JavaScript code.
- The custom regularizer is defined in JavaScript, but is not registered properly with tf.serialization.registerClass().
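For reference, the conversion command has this general form (the input and output paths here are placeholders, not the actual ones I used):

tensorflowjs_converter --input_format keras path/to/model.h5 path/to/tfjs_model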
The model.json file content is:
{
 "format": "layers-model",
 "generatedBy": "keras v2.4.0",
 "convertedBy": "TensorFlow.js Converter v2.3.0",
 "modelTopology": {
  "keras_version": "2.4.0",
  "backend": "tensorflow",
  "model_config": {
   "class_name": "Sequential",
   "config": {
    "name": "sequential",
    "layers": [
     {
      "class_name": "InputLayer",
      "config": {
       "batch_input_shape": [null, 22, 13],
       "dtype": "float32",
       "sparse": false,
       "ragged": false,
       "name": "input_dense_input"
      }
     },
     {
      "class_name": "Dense",
      "config": {
       "name": "input_dense",
       "trainable": true,
       "batch_input_shape": [null, 22, 13],
       "dtype": "float32",
       "units": 24,
       "activation": "tanh",
       "use_bias": true,
       "kernel_initializer": {
        "class_name": "GlorotUniform",
        "config": { "seed": null }
       },
       "bias_initializer": { "class_name": "Zeros", "config": {} },
       "kernel_regularizer": null,
       "bias_regularizer": null,
       "activity_regularizer": null,
       "kernel_constraint": null,
       "bias_constraint": null
      }
     },
     {
      "class_name": "GRU",
      "config": {
       "name": "gru",
       "trainable": true,
       "dtype": "float32",
       "return_sequences": true,
       "return_state": false,
       "go_backwards": false,
       "stateful": false,
       "unroll": false,
       "time_major": false,
       "units": 24,
       "activation": "tanh",
       "recurrent_activation": "sigmoid",
       "use_bias": true,
       "kernel_initializer": {
        "class_name": "GlorotUniform",
        "config": { "seed": null }
       },
       "recurrent_initializer": {
        "class_name": "Orthogonal",
        "config": { "gain": 1.0, "seed": null }
       },
       "bias_initializer": { "class_name": "Zeros", "config": {} },
       "kernel_regularizer": {
        "class_name": "L2",
        "config": { "l2": 9.999999974752427e-7 }
       },
       "recurrent_regularizer": {
        "class_name": "L2",
        "config": { "l2": 9.999999974752427e-7 }
       },
       "bias_regularizer": null,
       "activity_regularizer": null,
       "kernel_constraint": null,
       "recurrent_constraint": null,
       "bias_constraint": null,
       "dropout": 0.0,
       "recurrent_dropout": 0.0,
       "implementation": 2,
       "reset_after": false
      }
     },
     {
      "class_name": "Flatten",
      "config": {
       "name": "flatten",
       "trainable": true,
       "dtype": "float32",
       "data_format": "channels_last"
      }
     },
     {
      "class_name": "Dense",
      "config": {
       "name": "dense",
       "trainable": true,
       "dtype": "float32",
       "units": 2,
       "activation": "softmax",
       "use_bias": true,
       "kernel_initializer": {
        "class_name": "GlorotUniform",
        "config": { "seed": null }
       },
       "bias_initializer": { "class_name": "Zeros", "config": {} },
       "kernel_regularizer": null,
       "bias_regularizer": null,
       "activity_regularizer": null,
       "kernel_constraint": null,
       "bias_constraint": null
      }
     }
    ]
   }
  },
  "training_config": {
   "loss": "categorical_crossentropy",
   "metrics": ["accuracy"],
   "weighted_metrics": null,
   "loss_weights": null,
   "optimizer_config": {
    "class_name": "Nadam",
    "config": {
     "name": "Nadam",
     "learning_rate": 0.0020000000949949026,
     "decay": 0.004000000189989805,
     "beta_1": 0.8999999761581421,
     "beta_2": 0.9990000128746033,
     "epsilon": 1e-7
    }
   }
  }
 },
 "weightsManifest": [
  {
   "paths": ["group1-shard1of1.bin"],
   "weights": [
    { "name": "dense/kernel", "shape": [528, 2], "dtype": "float32" },
    { "name": "dense/bias", "shape": [2], "dtype": "float32" },
    { "name": "gru/gru_cell/kernel", "shape": [24, 72], "dtype": "float32" },
    {
     "name": "gru/gru_cell/recurrent_kernel",
     "shape": [24, 72],
     "dtype": "float32"
    },
    { "name": "gru/gru_cell/bias", "shape": [72], "dtype": "float32" },
    { "name": "input_dense/kernel", "shape": [13, 24], "dtype": "float32" },
    { "name": "input_dense/bias", "shape": [24], "dtype": "float32" }
   ]
  }
 ]
}
Option 1
There are no classes L1 and L2 in TensorFlow.js; they are just interfaces (more here).
There is a class L1L2 that takes the config and returns the right regularizer. You can manually replace all occurrences of "L2" with "L1L2" in model.json.
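For example, after the edit the GRU's kernel_regularizer entry in model.json would read as follows (the value is the one already in your file; apply the same rename to recurrent_regularizer):

"kernel_regularizer": {
 "class_name": "L1L2",
 "config": { "l2": 9.999999974752427e-7 }
},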
Option 2
Register a class named L2 that delegates to the built-in l1l2 regularizer:
import * as tf from '@tensorflow/tfjs'; // or use the global `tf` if tfjs is loaded via a <script> tag

// Stand-in for the Keras "L2" regularizer class: the constructor simply
// returns the built-in l1l2 regularizer created from the serialized config.
class L2 {
    static className = 'L2';
    constructor(config) {
        return tf.regularizers.l1l2(config);
    }
}
tf.serialization.registerClass(L2);
// now load the model
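With the class registered, the converted model loads as usual. A minimal sketch, assuming the converter output is served at a placeholder URL:

// 'path/to/model.json' is a placeholder for wherever the converter output is hosted
tf.loadLayersModel('path/to/model.json')
  .then(model => model.summary());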