There is more to perceptrons than the early work of Minsky and Papert suggested.
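As a reminder of what that early critique established, the sketch below trains a classic Rosenblatt perceptron (my own minimal illustration, not code from any cited work): it learns the linearly separable AND function perfectly but cannot reach full accuracy on XOR, since no single linear threshold separates XOR's classes.

```python
import itertools

def train_perceptron(data, epochs=50, lr=0.1):
    """Classic Rosenblatt perceptron learning rule with a bias term."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), target in data:
            pred = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = target - pred
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

def accuracy(data, w, b):
    correct = sum(
        (1 if w[0] * x1 + w[1] * x2 + b > 0 else 0) == t
        for (x1, x2), t in data
    )
    return correct / len(data)

inputs = list(itertools.product([0, 1], repeat=2))
AND = [(x, int(x[0] and x[1])) for x in inputs]
XOR = [(x, x[0] ^ x[1]) for x in inputs]

w, b = train_perceptron(AND)
print("AND accuracy:", accuracy(AND, w, b))  # separable: reaches 1.0
w, b = train_perceptron(XOR)
print("XOR accuracy:", accuracy(XOR, w, b))  # not separable: at most 0.75
```

The XOR failure is exactly the limitation Minsky and Papert highlighted; everything beyond it requires hidden layers or nonlinear features.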
GAN-to-GAN communication and backpropagation.
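The backpropagation half of that topic can be shown in miniature. The sketch below (a generic illustration of the chain rule, not a GAN and not tied to any specific system mentioned here) computes the gradient of a squared loss through a single sigmoid neuron by hand and uses it for gradient descent.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def forward(w, b, x):
    return sigmoid(w * x + b)

def loss(w, b, x, y):
    return 0.5 * (forward(w, b, x) - y) ** 2

def grad(w, b, x, y):
    """Backpropagation by hand: chain rule through loss and sigmoid."""
    a = forward(w, b, x)
    delta = (a - y) * a * (1.0 - a)  # dL/dz
    return delta * x, delta          # dL/dw, dL/db

# Gradient descent: fit the neuron so that x = 1.0 maps to y = 1.0.
w, b = 0.5, -0.5
for _ in range(500):
    dw, db = grad(w, b, 1.0, 1.0)
    w -= 0.5 * dw
    b -= 0.5 * db
print(forward(w, b, 1.0))  # approaches 1.0
```

The hand-derived gradient can be checked against a finite-difference estimate, which is the standard sanity test for any backpropagation implementation.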
Discussion of strategies as the number of layers approaches infinity.
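One well-established reason depth limits demand special strategies is signal attenuation or blow-up: repeated application of the same layer multiplies the signal by the layer's gain at every step. A scalar toy model (my simplification, not a full network) makes the asymptotics obvious.

```python
def depth_sweep(w, depth):
    """Pass a unit activation through `depth` identical scalar layers of gain w."""
    a = 1.0
    for _ in range(depth):
        a *= w
    return a

# Gain below 1 vanishes, gain above 1 explodes, exactly 1 is neutral.
for w in (0.9, 1.0, 1.1):
    print(w, depth_sweep(w, 100))
```

The same geometric behavior, generalized to matrix spectral norms, is why very deep networks rely on strategies such as careful initialization, normalization, or residual connections.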
The use of stochastic process models in neural networks is common. In this article I show how the probability density function collapses as a fault spreads epidemically through the network.
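To fix intuition for the epidemic-spread component (the density-collapse analysis itself is not reproduced here), the following is a minimal SI-style simulation of a fault propagating over a random graph. The graph model, transmission probability, and seeding at node 0 are all illustrative assumptions of mine, not the article's model.

```python
import random

def spread_fault(n_nodes, n_edges, p_transmit, steps, seed=0):
    """SI-style fault propagation on a random undirected graph (toy model)."""
    rng = random.Random(seed)
    edges = set()
    while len(edges) < n_edges:
        a, b = rng.randrange(n_nodes), rng.randrange(n_nodes)
        if a != b:
            edges.add((min(a, b), max(a, b)))
    faulty = {0}  # seed the fault at node 0 (arbitrary choice)
    history = [len(faulty)]
    for _ in range(steps):
        new = set()
        for a, b in edges:
            # A fault crosses an edge with one faulty endpoint with prob p_transmit.
            if (a in faulty) != (b in faulty) and rng.random() < p_transmit:
                new.add(b if a in faulty else a)
        faulty |= new
        history.append(len(faulty))
    return history

hist = spread_fault(n_nodes=50, n_edges=150, p_transmit=0.3, steps=10)
print(hist)  # monotonically non-decreasing fault counts per step
```

Because faulty nodes never recover in an SI model, the fault count is monotone, which gives the characteristic S-shaped epidemic growth curve on connected graphs.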
The problem of loose coupling, and how fuzzy Feynman diagrams can be turned into a priori descriptors of neural pathways.