While heavily evolving neural networks, I decided that I needed a clever and simple way to depict what a network looks like. So I started researching, and in my dusty pile of papers (by dusty I mean well organised, and by pile I mean my digital archive maintained by two different kinds of software, Zotero and Mendeley; papers still means papers) I found a really cool paper (Sporns & Tononi, 2001) about classes of network connectivity and dynamics. Yay.
This excellent paper contains a great idea about how to visualise a neural network's structural and functional connectivity. Structural connectivity is the sum total of all actual connections between neurons, while functional connectivity is the totality of connections that make neurons part of similar, or the same, processes. I decided to test this idea, and below is a sample.
Both of these pictures are produced by reordering the covariance matrix of the neural network using a clustering algorithm, which brings functionally related neurons closer together. This gives us a depiction of the functional circuitry in the network. Applying the same permutation to the connection matrix, we can also reorder the structure of the network and visualise what that functional circuitry looks like in structural terms. Here is a simple version of the Python code I'm using to do the matrix reordering.
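A minimal sketch of that reordering step, assuming SciPy's hierarchical clustering (the original post does not say which clustering algorithm was used, so average-linkage clustering on a correlation-based distance is my own choice here):

```python
# Sketch: reorder a covariance matrix so functionally related units sit together.
# Assumes hierarchical (average-linkage) clustering on a correlation distance;
# the original method may differ.
import numpy as np
from scipy.cluster.hierarchy import linkage, leaves_list
from scipy.spatial.distance import squareform

def reorder_by_clustering(cov):
    """Return the covariance matrix permuted by cluster order, and the permutation."""
    # Normalise to correlations, then turn similarity into a distance:
    # strongly covarying pairs become "close".
    std = np.sqrt(np.diag(cov))
    corr = cov / np.outer(std, std)
    dist = 1.0 - np.abs(corr)
    np.fill_diagonal(dist, 0.0)
    # SciPy's linkage wants the condensed (upper-triangular) distance vector.
    condensed = squareform(dist, checks=False)
    order = leaves_list(linkage(condensed, method="average"))
    # The same `order` can be applied to the weight matrix to reorder structure.
    return cov[np.ix_(order, order)], order

# Toy example: 6 units, two strongly correlated groups interleaved (0,2,4 vs 1,3,5).
rng = np.random.default_rng(0)
base = rng.normal(size=(2, 200))
acts = np.vstack([base[i % 2] + 0.1 * rng.normal(size=200) for i in range(6)])
reordered, order = reorder_by_clustering(np.cov(acts))
```

After reordering, the two interleaved groups come out contiguous, so the block structure of the functional circuitry becomes visible when the matrix is plotted.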
It doesn’t look like much right now, but if I use the right criterion for evolution, I could perhaps see lobes and sub-populations forming around specific functionality. This could also help in discerning what structural and functional differences separate the neural networks that benefit most from noise from all the rest.
I have to admit, though, that even if it is hard to understand what the above pictures are showing, the symmetry is enthralling.
O. Sporns and G. Tononi, “Classes of network connectivity and dynamics,” Complexity, vol. 7, pp. 28–38, Sep. 2001.