Graph Structure of Neural Networks
Revision as of 18:23, 10 November 2020
Presented By
Xiaolan Xu, Robin Wen, Yue Weng, Beizhen Chang
Introduction
We develop a new way of representing a neural network as a graph, which we call a relational graph. Our key insight is to focus on message exchange between nodes, rather than just on directed data flow. As a simple example, for a fixed-width fully-connected layer, we can represent one input channel and one output channel together as a single node, and an edge in the relational graph then represents the message exchange between two such nodes (Figure 1(a)).
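The idea above can be sketched in a few lines of plain Python. This is a minimal, hypothetical illustration (the names `edges`, `message_round`, and the sum aggregation are our own choices, not the paper's implementation): each node of the relational graph bundles one input and one output channel, and one round of message exchange lets every node aggregate the features of its neighbors. A complete graph recovers an ordinary fully-connected layer, since every node exchanges messages with every other node.

```python
# Minimal sketch of message exchange on a relational graph.
# A complete graph on 4 nodes corresponds to a fixed-width
# fully-connected layer: every pair of nodes exchanges messages.
nodes = [0, 1, 2, 3]
edges = {(i, j) for i in nodes for j in nodes if i < j}

def neighbors(v):
    # Nodes sharing an (undirected) edge with v.
    return [j for (i, j) in edges if i == v] + [i for (i, j) in edges if j == v]

def message_round(features):
    """One round of message exchange: each node keeps its own
    feature and sums the messages received from its neighbors."""
    return {v: features[v] + sum(features[u] for u in neighbors(v))
            for v in nodes}

features = {0: 1.0, 1: 2.0, 2: 3.0, 3: 4.0}
print(message_round(features))  # every node sees the full sum: all values become 10.0
```

On the complete graph every node ends up with the total of all features, mirroring how every output channel of a fully-connected layer depends on every input channel; sparser relational graphs would restrict which channels communicate.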
Relational Graph
Parameter Definition
(1) Clustering Coefficient
(2) Average Path Length
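Both parameters have standard graph-theoretic definitions, which the following self-contained sketch computes for a small example graph (the adjacency dictionary and function names are our own illustrative choices). The clustering coefficient of a node is the fraction of its neighbor pairs that are themselves connected, averaged over all nodes; the average path length is the mean shortest-path distance over all pairs of nodes.

```python
from collections import deque
from itertools import combinations

# Example graph: a triangle (0-1-2) with a pendant node 3 attached to 2.
adj = {0: {1, 2}, 1: {0, 2}, 2: {0, 1, 3}, 3: {2}}

def clustering_coefficient(adj):
    """Average over nodes of: connected neighbor pairs / possible neighbor pairs."""
    coeffs = []
    for v, nbrs in adj.items():
        k = len(nbrs)
        if k < 2:
            coeffs.append(0.0)  # convention: coefficient 0 for degree < 2
            continue
        links = sum(1 for u, w in combinations(nbrs, 2) if w in adj[u])
        coeffs.append(2.0 * links / (k * (k - 1)))
    return sum(coeffs) / len(adj)

def average_path_length(adj):
    """Mean shortest-path distance over all reachable node pairs (BFS from each node)."""
    total, pairs = 0, 0
    for s in adj:
        dist = {s: 0}
        q = deque([s])
        while q:
            v = q.popleft()
            for u in adj[v]:
                if u not in dist:
                    dist[u] = dist[v] + 1
                    q.append(u)
        total += sum(d for n, d in dist.items() if n != s)
        pairs += len(dist) - 1
    return total / pairs

print(clustering_coefficient(adj))  # 7/12: the triangle nodes are tightly clustered
print(average_path_length(adj))     # 4/3: most pairs are one hop apart
```

These are the two measures used to place each relational graph in a two-dimensional design space; the paper's "sweet spot" result is stated in terms of them.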