I am considering creating a customized neural network. The basic structure is the same as usual, but I want to truncate the connections between layers. For example, if I construct a network with two hidden layers, I would like to delete some of the weights and keep the others, like so:
This is not conventional dropout (used to avoid overfitting): the remaining weights (connections) should be specified in advance and stay fixed throughout training.
Is there a way to do this in Python, with TensorFlow, PyTorch, Theano, or any other module?
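To make the question concrete, here is a minimal sketch of the kind of thing I have in mind, written in PyTorch (the class name `MaskedLinear` and the mask values are just illustrative placeholders): a linear layer whose weight matrix is multiplied by a fixed 0/1 mask in the forward pass, so the masked-out connections contribute nothing and receive zero gradient.

```python
import torch
import torch.nn as nn

class MaskedLinear(nn.Linear):
    """Linear layer whose weights are elementwise-multiplied by a fixed binary mask."""
    def __init__(self, in_features, out_features, mask):
        super().__init__(in_features, out_features)
        # mask has shape (out_features, in_features); it is registered as a
        # buffer, so it is saved with the model but never updated by training.
        self.register_buffer("mask", mask.float())

    def forward(self, x):
        # Only connections where mask == 1 take part in the computation;
        # gradients for masked-out weights are zero, so they stay unused.
        return nn.functional.linear(x, self.weight * self.mask, self.bias)

# Example: a 4 -> 3 layer keeping only a hand-chosen subset of connections.
mask = torch.tensor([[1, 0, 1, 0],
                     [0, 1, 0, 1],
                     [1, 1, 0, 0]])
layer = MaskedLinear(4, 3, mask)
out = layer(torch.randn(2, 4))
print(out.shape)  # torch.Size([2, 3])
```

I am not sure whether masking like this is the idiomatic way to fix a sparse connectivity pattern, or whether the frameworks offer something built in for it.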