TFLite MicroInterpreter() never returns on Arduino Nano 33 BLE Sense #26
Open
Description
I am trying out the gesture recognition tutorial, using the IMU_Classifier example to detect a "punch" gesture.
The Arduino setup() code blocks (never returns) when constructing the MicroInterpreter instance, as shown below.
```cpp
// Create a static memory buffer for TFLM; the size may need to
// be adjusted based on the model you are using.
constexpr int tensorArenaSize = 8 * 1024;
byte tensorArena[tensorArenaSize];

tfLiteInterpreter = new tflite::MicroInterpreter(tfLiteModel, tfLiteOpsResolver, tensorArena, tensorArenaSize, &tfLiteErrorReporter);
```
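For reference, the rest of my initialization follows the IMU_Classifier sketch fairly closely. Below is a minimal sketch of that flow with status checks around the calls that can fail; the include paths, the `model` array name, and the alignment attribute come from the example and recent TFLM releases, so they may differ between library versions.

```cpp
#include <TensorFlowLite.h>

#include <tensorflow/lite/micro/all_ops_resolver.h>
#include <tensorflow/lite/micro/micro_error_reporter.h>
#include <tensorflow/lite/micro/micro_interpreter.h>
#include <tensorflow/lite/schema/schema_generated.h>
#include <tensorflow/lite/version.h>

#include "model.h"  // exported gesture model as a C array named "model"

tflite::MicroErrorReporter tfLiteErrorReporter;
tflite::AllOpsResolver tfLiteOpsResolver;
const tflite::Model* tfLiteModel = nullptr;
tflite::MicroInterpreter* tfLiteInterpreter = nullptr;

constexpr int tensorArenaSize = 8 * 1024;
// 16-byte alignment avoids subtle allocation problems on some boards.
byte tensorArena[tensorArenaSize] __attribute__((aligned(16)));

void setup() {
  Serial.begin(9600);
  while (!Serial);

  tfLiteModel = tflite::GetModel(model);
  // A schema mismatch between the converter and the Arduino library can
  // cause failures, so check it before building the interpreter.
  if (tfLiteModel->version() != TFLITE_SCHEMA_VERSION) {
    Serial.println("Model schema version mismatch!");
    while (true);
  }

  tfLiteInterpreter = new tflite::MicroInterpreter(
      tfLiteModel, tfLiteOpsResolver, tensorArena, tensorArenaSize,
      &tfLiteErrorReporter);

  // An undersized arena normally shows up here as a failed status rather
  // than a hang in the constructor above.
  if (tfLiteInterpreter->AllocateTensors() != kTfLiteOk) {
    Serial.println("AllocateTensors() failed - try a larger tensorArenaSize");
    while (true);
  }

  Serial.println("Interpreter ready");
}

void loop() {
  // IMU sampling and inference would go here.
}
```

Checking the schema version and the AllocateTensors() status should at least narrow down where the hang occurs.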
Could this be due to the model size, or to some other dependency not being met? I tried shrinking the model layers to reduce the model size, as shown below, but the interpreter still never returns. Here I am only trying to classify the punch gesture. (An arena-usage check is also sketched after the training code below.)
```python
import tensorflow as tf

# Build the model and train it
model = tf.keras.Sequential()
model.add(tf.keras.layers.Dense(16, activation='relu'))
model.add(tf.keras.layers.Dense(8, activation='relu'))
# Sigmoid output, since here I am only classifying the single "punch" gesture
model.add(tf.keras.layers.Dense(NUM_GESTURES, activation='sigmoid'))

model.compile(optimizer='rmsprop', loss='mse', metrics=['mae'])
history = model.fit(inputs_train, outputs_train, epochs=600, batch_size=1,
                    validation_data=(inputs_validate, outputs_validate))
```
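If it turns out to be arena sizing rather than the constructor itself, printing the arena usage after a successful AllocateTensors() would show how close this model is to the 8 KB limit. This is only a sketch, assuming the TFLM release bundled with the Arduino library provides MicroInterpreter::arena_used_bytes():

```cpp
// Sketch: report how much of the tensor arena the model actually needs,
// so tensorArenaSize can be set with some headroom. Intended to be pasted
// into the sketch above; assumes arena_used_bytes() is available.
void reportArenaUsage(tflite::MicroInterpreter& interpreter, size_t reservedBytes) {
  Serial.print("Arena bytes used: ");
  Serial.println((unsigned long)interpreter.arena_used_bytes());
  Serial.print("Arena bytes reserved: ");
  Serial.println((unsigned long)reservedBytes);
}

// Example call, after AllocateTensors() returns kTfLiteOk:
//   reportArenaUsage(*tfLiteInterpreter, tensorArenaSize);
```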