Graph Structure of Neural Networks
Presented By
Xiaolan Xu, Robin Wen, Yue Weng, Beizhen Chang
Introduction
We develop a new way of representing a neural network as a graph, which we call a relational graph. The key insight is to focus on message exchange rather than just on directed data flow. As a simple example, for a fixed-width fully-connected layer, we can represent one input channel and one output channel together as a single node, and an edge in the relational graph represents the message exchange between the two nodes (Figure 1(a)).
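To make the message-exchange view concrete, here is a minimal NumPy sketch of one round of message exchange on a relational graph. The function name, the sum aggregation, and the tanh activation are illustrative assumptions rather than code from the paper; the point is that on a complete graph with self-loops, one round reduces to an ordinary fully-connected layer.

```python
import numpy as np

def message_exchange_round(x, adj, W, activation=np.tanh):
    """One round of message exchange on a relational graph (illustrative).

    x   -- node features, shape (n,): one scalar channel per node
    adj -- symmetric 0/1 adjacency matrix, shape (n, n), with self-loops,
           so each node also receives a message from itself
    W   -- learnable per-edge weights, shape (n, n)

    Each node sums the weighted messages from its neighbours and applies
    the activation; on a complete graph this is exactly a width-n
    fully-connected layer.
    """
    return activation((adj * W) @ x)

# A complete relational graph on 4 nodes == a 4-unit fully-connected layer.
n = 4
rng = np.random.default_rng(0)
adj = np.ones((n, n))            # complete graph, self-loops included
W = rng.normal(size=(n, n))      # edge weights (the layer's parameters)
x = rng.normal(size=n)           # input features, one per node
print(message_exchange_round(x, adj, W))
```

Sparser choices of `adj` give the other relational graphs the paper explores, while the parameter count and computation pattern stay comparable.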
Relational Graph
Parameter Definition
(1) Clustering Coefficient: for each node, the fraction of pairs of its neighbours that are themselves connected by an edge; the graph-level measure is the average over all nodes. It captures how locally clustered the graph is.

(2) Average Path Length: the shortest-path distance between two nodes, averaged over all pairs of nodes. It captures how efficiently information can travel across the whole graph.
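Both measures are standard graph statistics, so they can be computed with off-the-shelf tools. The sketch below assumes the networkx library; the Watts-Strogatz generator is only a stand-in example graph, not the paper's actual graph sampler.

```python
import networkx as nx

# Stand-in example graph (an assumption for illustration); any
# connected, undirected nx.Graph could take its place here.
G = nx.connected_watts_strogatz_graph(n=64, k=4, p=0.5, seed=0)

# Average clustering coefficient C: for each node, the fraction of its
# neighbour pairs that are themselves connected, averaged over nodes.
C = nx.average_clustering(G)

# Average path length L: the shortest-path distance between two nodes,
# averaged over all node pairs (the graph must be connected).
L = nx.average_shortest_path_length(G)

print(f"C = {C:.3f}, L = {L:.3f}")
```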