Graph Structure of Neural Networks
Contents
- 1 Presented By
- 2 Introduction
- 3 Relational Graph
- 4 Parameter Definition
- 5 Experimental Setup (Section 4 in the paper)
- 6 Major Conclusions (Section 5 in the paper)
- 6.1 1. Neural network performance depends on its structure
- 6.2 2. Sweet spot where performance is significantly improved
- 6.3 3. Neural network performance is approximately a smooth function of the clustering coefficient and average path length of its relational graph
- 6.4 4. Consistency among many different tasks and datasets
- 6.5 5. Top architectures can be identified efficiently
- 6.6 6. Well-performing neural networks have graph structures surprisingly similar to those of real biological neural networks
- 7 Critique
Presented By
Xiaolan Xu, Robin Wen, Yue Weng, Beizhen Chang
Introduction
The authors develop a new way of representing a neural network as a graph, which they call a relational graph. The key insight is to focus on message exchange rather than just on directed data flow. As a simple example, for a fixed-width fully-connected layer, one input channel and one output channel can be represented together as a single node, and an edge in the relational graph represents the message exchange between the two nodes (Figure 1(a)).
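To make this concrete, below is a minimal NumPy sketch of one round of message exchange on a small relational graph standing in for a fixed-width fully-connected layer. The node count, feature dimension, adjacency matrix, and the name `message_exchange` are illustrative assumptions, not code from the paper.

<pre>
import numpy as np

# Minimal sketch: a 4-node relational graph for a fixed-width MLP layer.
# Each node holds a slice of the layer's channels; an edge (u, v) means
# node v receives a message from node u during one round of exchange.
n_nodes = 4          # nodes in the relational graph
dim = 16             # feature dimension held by each node

# Undirected adjacency matrix (self-loops included so a node keeps its own state).
A = np.array([[1, 1, 0, 1],
              [1, 1, 1, 0],
              [0, 1, 1, 1],
              [1, 0, 1, 1]])

# One weight matrix per edge (a dense layer restricted to the graph's edges).
W = np.random.randn(n_nodes, n_nodes, dim, dim) * 0.1

def message_exchange(x, A, W):
    """One round of message exchange: each node aggregates (sums)
    transformed features from its neighbors."""
    out = np.zeros_like(x)
    for v in range(n_nodes):
        for u in range(n_nodes):
            if A[u, v]:
                out[v] += W[u, v] @ x[u]   # message from node u to node v
    return out

x = np.random.randn(n_nodes, dim)    # node features (groups of channels)
x_next = message_exchange(x, A, W)   # corresponds to one fully-connected layer
print(x_next.shape)                  # (4, 16)
</pre>

If the adjacency matrix were all ones, this round of exchange would recover an ordinary fully-connected layer; sparser graphs correspond to the other relational graphs the paper explores.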
Relational Graph
Parameter Definition
(1) Clustering Coefficient: for a node, the fraction of pairs of its neighbours that are themselves connected by an edge; the clustering coefficient C of the graph is this quantity averaged over all nodes.

(2) Average Path Length: the shortest-path distance between two nodes, averaged over all pairs of nodes in the graph, denoted L.
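As a rough illustration, both measures can be computed with NetworkX as sketched below. The Watts–Strogatz example graph is only an assumption for demonstration, not the graph family explored in the paper.

<pre>
import networkx as nx

# Sketch: computing the two graph measures on an example small-world graph.
G = nx.connected_watts_strogatz_graph(n=64, k=8, p=0.1)

clustering = nx.average_clustering(G)             # clustering coefficient C
path_length = nx.average_shortest_path_length(G)  # average path length L

print(f"C = {clustering:.3f}, L = {path_length:.3f}")
</pre>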