Data re-uploading implementation in a hybrid NN with a Keras layer

I have found a temporary solution, and it works!

I packed the per-layer parameters into the third dimension of the weights tensor.

@qml.qnode(dev, diff_method='adjoint')
def qnode(inputs, weights):
    # re-upload the data before each entangling block
    for i in range(layers):
        qml.templates.AngleEmbedding(inputs, wires=range(n_qubits))
        # slice out the 3 parameters per qubit that belong to block i
        qml.templates.StronglyEntanglingLayers(weights[:, :, 3*i:3*(i+1)], wires=range(n_qubits))
    return [qml.expval(qml.PauliZ(i)) for i in range(n_qubits)]

weight_shapes = {"weights": (1, n_qubits, 3*layers)}
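To sanity-check the slicing, here is a quick NumPy sketch (assuming n_qubits = 1 and layers = 2, which are not stated explicitly above): each slice weights[:, :, 3*i:3*(i+1)] has shape (1, n_qubits, 3), which is exactly the (layers=1, n_qubits, 3) shape that a single StronglyEntanglingLayers block expects per re-upload.

```python
import numpy as np

# assumed values for illustration only
n_qubits = 1
layers = 2

# one flat weight tensor packing all re-uploading blocks along axis 2
weights = np.zeros((1, n_qubits, 3 * layers))

for i in range(layers):
    block = weights[:, :, 3 * i : 3 * (i + 1)]
    # each slice matches the (1, n_qubits, 3) shape of one entangling block
    print(block.shape)  # (1, 1, 3)
```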
Epoch 1/30
1/1 [==============================] - 0s 140ms/step - loss: 0.8418 - val_loss: 0.1298
Epoch 2/30
1/1 [==============================] - 0s 65ms/step - loss: 0.1251 - val_loss: 0.1253
Epoch 3/30
1/1 [==============================] - 0s 65ms/step - loss: 0.0714 - val_loss: 0.1183

No errors occur.

To check that the model is correct, I inspected the number of trainable parameters.

model.summary()
Model: "sequential_14"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
dense_28 (Dense)             (1, 1)                    2         
_________________________________________________________________
keras_layer_14 (KerasLayer)  (1, 1)                    6         
_________________________________________________________________
dense_29 (Dense)             (1, 1)                    2         
=================================================================
Total params: 10
Trainable params: 10
Non-trainable params: 0

Since StronglyEntanglingLayers requires 3 trainable parameters per qubit per layer, two entangling blocks on a single qubit should give 6 trainable parameters.
The result above matches, so the model looks correct!
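The expected count can be computed directly: 3 rotation angles per qubit per re-uploading block (again assuming n_qubits = 1 and layers = 2), which reproduces the 6 parameters shown for the KerasLayer in the summary.

```python
# assumed values for illustration only
n_qubits = 1
layers = 2

# StronglyEntanglingLayers: 3 rotation angles per qubit per block
n_params = n_qubits * layers * 3
print(n_params)  # 6
```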

But this implementation does not feel very elegant.
If anyone has a better idea, please let me know.