Need Help with precise-lite Custom Trigger Phrase

Hi,

I previously had a custom trigger phrase (“computer”) working with precise-engine back when I ran Mycroft, and I am now trying to get it working on OVOS using precise-lite-trainer. I have hit an error I’m not sure how to solve. First, to quickly walk through how I handled earlier errors (in case I went wrong somewhere there), the first one was:

ValueError: The filepath provided must end in .keras (Keras model format). Received: filepath=computer/model/computer

I worked around that by creating an empty zip file renamed to end in .keras instead of .zip. I had also hit a different error, which I didn’t record, but I got past it by removing the use_multiprocessing=True argument in train.py. I’m sure there’s a better way to handle that and I’m open to suggestions, but it isn’t my primary problem.
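In hindsight, I suspect the rename wasn’t enough: as far as I can tell, a Keras 3 .keras file is a zip archive with specific members (config.json, metadata.json, model.weights.h5), and my empty zip contains none of them. Here’s a quick stdlib check I put together to convince myself of that (the helper name is my own invention):

```python
import io
import zipfile

def is_valid_keras_archive(path_or_fileobj):
    """Rough check for a Keras 3 .keras archive: a zip file that
    contains at least config.json and model.weights.h5."""
    try:
        with zipfile.ZipFile(path_or_fileobj) as zf:
            names = set(zf.namelist())
    except zipfile.BadZipFile:
        return False
    return {"config.json", "model.weights.h5"} <= names

# An empty zip renamed to .keras (what I created) fails the check,
# which would explain the "no item named 'config.json'" error I describe below.
empty = io.BytesIO()
with zipfile.ZipFile(empty, "w"):
    pass
print(is_valid_keras_archive(empty))  # False
```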

The main error I need help with is KeyError: "There is no item named 'config.json' in the archive". I can’t find any information on how to create that .json file. I don’t have much knowledge of these machine-learning libraries, at least for this kind of usage with Keras, etc. I gleaned what information I could about how the model is structured and had AI take its best guess at a JSON file. Below is the error I get from the guesstimated JSON:

Loading from computer/model/computer.keras...
Traceback (most recent call last):
  File "/home/pi/precise-lite-trainer/training.py", line 9, in <module>
    trainer = PreciseTrainer(model_path, folder, epochs=100, log_dir=log_dir)
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/pi/precise-lite-trainer/precise_trainer/train.py", line 75, in __init__
    self.model = get_model(model, model_params)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/pi/precise-lite-trainer/precise_trainer/model.py", line 56, in get_model
    model = load_model(model_name)
            ^^^^^^^^^^^^^^^^^^^^^^
  File "/home/pi/.venvs/training/.venv/lib/python3.11/site-packages/keras/src/saving/saving_api.py", line 189, in load_model
    return saving_lib.load_model(
           ^^^^^^^^^^^^^^^^^^^^^^
  File "/home/pi/.venvs/training/.venv/lib/python3.11/site-packages/keras/src/saving/saving_lib.py", line 367, in load_model
    return _load_model_from_fileobj(
           ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/pi/.venvs/training/.venv/lib/python3.11/site-packages/keras/src/saving/saving_lib.py", line 444, in _load_model_from_fileobj
    model = _model_from_config(
            ^^^^^^^^^^^^^^^^^^^
  File "/home/pi/.venvs/training/.venv/lib/python3.11/site-packages/keras/src/saving/saving_lib.py", line 433, in _model_from_config
    model = deserialize_keras_object(
            ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/pi/.venvs/training/.venv/lib/python3.11/site-packages/keras/src/saving/serialization_lib.py", line 694, in deserialize_keras_object
    cls = _retrieve_class_or_fn(
          ^^^^^^^^^^^^^^^^^^^^^^
  File "/home/pi/.venvs/training/.venv/lib/python3.11/site-packages/keras/src/saving/serialization_lib.py", line 803, in _retrieve_class_or_fn
    raise TypeError(
TypeError: Could not locate class 'Sequential'. Make sure custom classes are decorated with `@keras.saving.register_keras_serializable()`. Full object config: {'class_name': 'Sequential', 'config': {'name': 'sequential', 'layers': [{'class_name': 'GRU', 'config': {'name': 'net', 'trainable': True, 'batch_input_shape': [None, 29, 13], 'dtype': 'float32', 'units': 20, 'activation': 'tanh', 'recurrent_activation': 'sigmoid', 'use_bias': True, 'kernel_initializer': {'class_name': 'GlorotUniform', 'config': {'seed': None}}, 'recurrent_initializer': {'class_name': 'Orthogonal', 'config': {'gain': 1.0, 'seed': None}}, 'bias_initializer': {'class_name': 'Zeros', 'config': {}}, 'kernel_regularizer': None, 'recurrent_regularizer': None, 'bias_regularizer': None, 'activity_regularizer': None, 'kernel_constraint': None, 'recurrent_constraint': None, 'bias_constraint': None, 'dropout': 0.0, 'recurrent_dropout': 0.0, 'return_sequences': False, 'return_state': False, 'go_backwards': False, 'stateful': False, 'unroll': False, 'time_major': False, 'reset_after': False, 'zero_output_for_mask': False}}, {'class_name': 'Dense', 'config': {'name': 'dense', 'trainable': True, 'dtype': 'float32', 'units': 1, 'activation': 'linear', 'use_bias': True, 'kernel_initializer': {'class_name': 'GlorotUniform', 'config': {'seed': None}}, 'bias_initializer': {'class_name': 'Zeros', 'config': {}}, 'kernel_regularizer': None, 'bias_regularizer': None, 'activity_regularizer': None, 'kernel_constraint': None, 'bias_constraint': None}}]}, 'keras_version': '3.8.0', 'backend': 'tensorflow'}

Here’s the actual JSON file I attempted with precise-lite:

{
  "class_name": "Sequential",
  "config": {
    "name": "sequential",
    "layers": [
      {
        "class_name": "GRU",
        "config": {
          "name": "net",
          "trainable": true,
          "batch_input_shape": [null, 29, 13],
          "dtype": "float32",
          "units": 20,
          "activation": "tanh",
          "recurrent_activation": "sigmoid",
          "use_bias": true,
          "kernel_initializer": {
            "class_name": "GlorotUniform",
            "config": {
              "seed": null
            }
          },
          "recurrent_initializer": {
            "class_name": "Orthogonal",
            "config": {
              "gain": 1.0,
              "seed": null
            }
          },
          "bias_initializer": {
            "class_name": "Zeros",
            "config": {}
          },
          "kernel_regularizer": null,
          "recurrent_regularizer": null,
          "bias_regularizer": null,
          "activity_regularizer": null,
          "kernel_constraint": null,
          "recurrent_constraint": null,
          "bias_constraint": null,
          "dropout": 0.0,
          "recurrent_dropout": 0.0,
          "return_sequences": false,
          "return_state": false,
          "go_backwards": false,
          "stateful": false,
          "unroll": false,
          "time_major": false,
          "reset_after": false,
          "zero_output_for_mask": false
        }
      },
      {
        "class_name": "Dense",
        "config": {
          "name": "dense",
          "trainable": true,
          "dtype": "float32",
          "units": 1,
          "activation": "linear",
          "use_bias": true,
          "kernel_initializer": {
            "class_name": "GlorotUniform",
            "config": {
              "seed": null
            }
          },
          "bias_initializer": {
            "class_name": "Zeros",
            "config": {}
          },
          "kernel_regularizer": null,
          "bias_regularizer": null,
          "activity_regularizer": null,
          "kernel_constraint": null,
          "bias_constraint": null
        }
      }
    ]
  },
  "keras_version": "3.8.0",
  "backend": "tensorflow"
}

I likely have some of this wrong: I gave the AI as much information as I had about what I was doing, but some of it was inferred and probably incorrect.
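For reference, this is roughly how I injected that JSON into the archive so Keras could find it (the approach is purely my own experimenting, so it may itself be the problem; in particular the archive still has no metadata.json or model.weights.h5, only the config):

```python
import json
import tempfile
import zipfile
from pathlib import Path

# Write the hand-made config into the .keras zip so load_model can at
# least find "config.json". A real saved model would also contain
# metadata.json and model.weights.h5, which I don't have.
config = {"class_name": "Sequential", "config": {"name": "sequential", "layers": []}}  # trimmed

model_path = Path(tempfile.mkdtemp()) / "computer.keras"  # stand-in for computer/model/computer.keras
with zipfile.ZipFile(model_path, "w") as zf:
    zf.writestr("config.json", json.dumps(config))

with zipfile.ZipFile(model_path) as zf:
    print(zf.namelist())  # ['config.json']
```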

Here’s the script I tried using to actually run precise-trainer:

from precise_trainer import PreciseTrainer

model_name = "computer"
folder = f"computer/{model_name}"  # dataset here
model_path = f"computer/model/{model_name}.keras"  # save here
log_dir = f"./logs/fit/{model_name}"  # for tensorboard

# train a model
trainer = PreciseTrainer(model_path, folder, epochs=100, log_dir=log_dir)
model_file = trainer.train()
# Data: <TrainData wake_words=155 not_wake_words=89356 test_wake_words=39 test_not_wake_words=22339>
# Loading wake-word...
# Loading not-wake-word...
# Loading wake-word...
# Loading not-wake-word...
# Inputs shape: (81602, 29, 13)
# Outputs shape: (81602, 1)
# Test inputs shape: (20486, 29, 13)
# Test outputs shape: (20486, 1)
# Model: "sequential"
# _________________________________________________________________
#  Layer (type)                Output Shape              Param #   
# =================================================================
#  net (GRU)                   (None, 20)                2100      
#                                                                  
#  dense (Dense)               (None, 1)                 21        
#                                                                  
# =================================================================
# Total params: 2,121
# Trainable params: 2,121
# Non-trainable params: 0
# .....
# _________________________________________________________________
# Epoch 1280/1379
# 157/160 [============================>.] - ETA: 0s - loss: 0.0308 - accuracy: 0.9868
# ....
# Wrote to /home/miro/PycharmProjects/ovos-audio-classifiers/trained/hey_computer/model.tflite
trainer.test()

# === Counts ===
# False Positives: 2
# True Negatives: 20445
# False Negatives: 2
# True Positives: 37
# 
# === Summary ===
# 20482 out of 20486
# 99.98%
# 
# 0.01% false positives
# 5.13% false negatives

Please advise on what I need to do to get things working. I honestly don’t know what I’m doing and I’m aimlessly guessing at this point. Thanks!

Update: I ended up forking precise-lite-trainer and had AI make a few tweaks based on the errors I was getting, and now I can successfully produce a .tflite file after running my training.py file in my virtual environment using precise_lite_runner_0.4.2.

However, when I try to use it, I’m not getting any results from wakeword detection. Did I get the hotwords entry syntax right in mycroft.conf?

"computer": {
    "module": "ovos-ww-plugin-precise-lite",
    "local_model_file": "/home/pi/.local/share/mycroft/trigger_phrases/computer.tflite",
    "expected_duration": 3,
    "trigger_level": 3,
    "sensitivity": 0.31,
    "listen": true,
    "fallback_ww": "hey_mycroft"
}
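In case the nesting matters, here is my best understanding of the fuller context that entry should sit in — please check this too, since the surrounding keys (the top-level "hotwords" section and "listener.wake_word") are my assumption about the config layout rather than something I’ve confirmed:

```json
{
  "listener": {
    "wake_word": "computer"
  },
  "hotwords": {
    "computer": {
      "module": "ovos-ww-plugin-precise-lite",
      "local_model_file": "/home/pi/.local/share/mycroft/trigger_phrases/computer.tflite",
      "expected_duration": 3,
      "trigger_level": 3,
      "sensitivity": 0.31,
      "listen": true,
      "fallback_ww": "hey_mycroft"
    }
  }
}
```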

Any idea why it might not be working? I’m not sure whether the problem is the .tflite model created by my fork or how I added it to the mycroft.conf configuration. Is there any reason I would need to recreate my dataset, given that it’s the same data I used with Mycroft, just re-trained as a precise-lite model?
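One cheap sanity check I thought of for whether my fork at least produced a structurally valid model file: my understanding is that TensorFlow Lite models are FlatBuffers carrying the file identifier "TFL3" at byte offset 4, so a quick header check would catch a totally broken export (though obviously not a model that’s valid but badly trained):

```python
def looks_like_tflite(path):
    """Cheap structural check: TFLite FlatBuffer files carry the
    identifier b"TFL3" at byte offset 4."""
    with open(path, "rb") as f:
        header = f.read(8)
    return len(header) == 8 and header[4:8] == b"TFL3"

# e.g. looks_like_tflite("/home/pi/.local/share/mycroft/trigger_phrases/computer.tflite")
```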