I trained the model for 1000 epochs; this is the output and the loss for the last 100 epochs.
The result stays the same even when I train the model for more than 1000 epochs. I tried increasing the depth from 2 to 3 and then 4, but the result is always the same and training only gets slower. I also tried changing the learning rates, for instance raising the generator's (since it has the higher loss) from lg=0.3 to lg=0.9 while keeping ld=0.01, and also using the same learning rate for both the generator and the discriminator, lr=0.0002. Another test I did was using an Adam optimizer instead of SGD, but to no avail.
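For reference, the optimizer configurations I tried look roughly like this (a minimal PyTorch sketch; the G and D modules here are placeholders for my actual networks, and the Adam betas are the common GAN defaults rather than values taken from my code):

```python
import torch
import torch.nn as nn

# Placeholder modules standing in for my actual generator and discriminator.
G = nn.Sequential(nn.Linear(100, 784), nn.Tanh())
D = nn.Sequential(nn.Linear(784, 1), nn.Sigmoid())

# Different SGD learning rates, higher for the generator (which has the higher loss):
opt_g = torch.optim.SGD(G.parameters(), lr=0.9)   # was lr=0.3
opt_d = torch.optim.SGD(D.parameters(), lr=0.01)

# Same learning rate for both networks:
opt_g = torch.optim.SGD(G.parameters(), lr=0.0002)
opt_d = torch.optim.SGD(D.parameters(), lr=0.0002)

# Adam instead of SGD (betas=(0.5, 0.999) are the usual GAN defaults; an assumption here):
opt_g = torch.optim.Adam(G.parameters(), lr=0.0002, betas=(0.5, 0.999))
opt_d = torch.optim.Adam(D.parameters(), lr=0.0002, betas=(0.5, 0.999))
```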
Normalization and splitting the generator into channels improved the output somewhat, but it no longer seems to evolve. I am thinking about other possible tests. Do you have any other suggestions?